Second-order Stein: SURE for SURE and other applications in high-dimensional inference
Authors
Pierre C. Bellec, Cun-Hui Zhang
Abstract
Stein’s formula states that a random variable of the form z⊤f(z) − div f(z) is mean-zero for all functions f with integrable gradient. Here, div f is the divergence of the function f and z is a standard normal vector. This paper aims to propose a Second-Order Stein formula that characterizes the variance of such random variables for all functions f(z) with square integrable gradient, and to demonstrate the usefulness of this formula in various applications. In the Gaussian sequence model, a remarkable consequence of Stein’s formula is Stein’s Unbiased Risk Estimate (SURE), an unbiased estimate of the mean squared risk of almost any given estimator μˆ of the unknown mean. A first application of the Second-Order Stein formula is an unbiased risk estimate for SURE itself (SURE for SURE), providing information about the distance between SURE and the squared estimation error of μˆ. SURE for SURE has a simple form as a function of the data and is applicable whenever SURE exists, for example for the Lasso and the Elastic Net. In addition to SURE for SURE, the following statistical applications are developed: (1) upper bounds on the risk of SURE when the estimation target is the squared error; (2) confidence regions based on SURE, using the Second-Order Stein formula; (3) oracle inequalities satisfied by SURE-tuned estimates under a mild Lipschitz assumption; (4) a bound on the variance of the size of the model selected by the Lasso, and more generally of the empirical degrees-of-freedom of convex penalized estimators; (5) explicit expressions of SURE for SURE for the Lasso and the Elastic Net; (6) in the linear model, a general semiparametric scheme to de-bias a differentiable initial estimator for inference of a low-dimensional projection of the regression coefficient vector, with a characterization of the variance after debiasing; and (7) an accuracy analysis of a Gaussian Monte Carlo scheme to approximate the divergence of functions f : Rⁿ → Rⁿ.
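Both identities in the abstract can be checked numerically. The sketch below is illustrative only and is not the paper’s construction: it assumes unit noise variance, uses soft-thresholding as the sequence-model stand-in for the Lasso with an arbitrary level λ, and picks an arbitrary matrix A. It verifies (a) that SURE is unbiased for the squared estimation error of soft-thresholding, and (b) the second-order variance identity in the easy linear case f(z) = Az, where z⊤f(z) − div f(z) = z⊤Az − tr(A) and the variance has the closed form E‖f(z)‖² + tr(A²) = tr(AA⊤) + tr(A²).

```python
import numpy as np

rng = np.random.default_rng(0)

# --- SURE for soft-thresholding in the Gaussian sequence model ---
# y ~ N(mu, I_n), unit noise variance assumed; lam is an arbitrary
# illustrative threshold, and mu = 0 is an illustrative choice of mean.
n, lam, reps = 50, 1.0, 20_000
mu = np.zeros(n)

def soft_threshold(y, lam):
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

y = mu + rng.standard_normal((reps, n))
mu_hat = soft_threshold(y, lam)
df = (np.abs(y) > lam).sum(axis=1)            # degrees of freedom (active set size)
sure_vals = ((mu_hat - y) ** 2).sum(axis=1) - n + 2 * df
err_vals = ((mu_hat - mu) ** 2).sum(axis=1)   # squared estimation error

print(sure_vals.mean(), err_vals.mean())      # the two averages should agree

# --- Second-Order Stein identity for a linear map f(z) = A z ---
# Here z^T f(z) - div f(z) = z^T A z - tr(A), and the variance equals
# tr(A A^T) + tr(A^2), i.e. E||f(z)||^2 plus the trace of the squared Jacobian.
d = 5
A = rng.standard_normal((d, d))               # arbitrary fixed matrix
target = np.trace(A @ A.T) + np.trace(A @ A)

z = rng.standard_normal((200_000, d))
vals = np.einsum('ri,ij,rj->r', z, A, z) - np.trace(A)
print(vals.var(), target)                     # should be close
```

The linear case makes a clean sanity check because the divergence is the constant tr(A), so no numerical differentiation is needed and the target variance is available in closed form.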
Similar resources
Sure Independence Screening for Ultra-High Dimensional Feature Space
High dimensionality is a growing feature in many areas of contemporary statistics. Variable selection is fundamental to high-dimensional statistical modeling. For problems of large or huge scale p_n, computational cost and estimation accuracy are always two top concerns. In a seminal paper, Candes and Tao (2007) propose a minimum l1 estimator, the Dantzig selector, and show that it mimics the id...
On Time – For Sure
When a computer takes forever to load a website, it may be annoying, but it is nothing more serious than that. If, however, the electronics in a car or a plane don’t process commands exactly when they are supposed to, the consequences can be fatal. Björn Brandenburg and his team at the Max Planck Institute for Software Systems in Kaiserslautern and Saarbrücken study how to construct real-time s...
ExSIS: Extended Sure Independence Screening for Ultrahigh-dimensional Linear Models
Statistical inference can be computationally prohibitive in ultrahigh-dimensional linear models. Correlation-based variable screening, in which one leverages marginal correlations for removal of irrelevant variables from the model prior to statistical inference, can be used to overcome this challenge. Prior works on correlation-based variable screening either impose strong statistical priors on...
Rejoinder: Sure independence screening for ultrahigh dimensional feature space
We are very grateful to all contributors for their stimulating comments and questions on the role of variable screening and selection on high-dimensional statistical modeling. This paper would not have been in the current form without the benefits of private communications with Professors Peter Bickel, Peter Bühlmann, Eitan Greenshtein, Qiwei Yao, Cun-Hui Zhang and Wenyang Zhang at various stag...
Sure independence screening for ultrahigh dimensional feature space
High dimensionality is a growing feature in many areas of contemporary statistics. Variable selection is fundamental to high-dimensional statistical modeling. For problems of large or huge scale p_n, computational cost and estimation accuracy are always two top concerns. In a seminal paper, Candes and Tao (2007) propose a minimum l1 estimator, the Dantzig selector, and show that it mimics the id...
Journal
Journal title: Annals of Statistics
Year: 2021
ISSN: 0090-5364, 2168-8966
DOI: https://doi.org/10.1214/20-aos2005